Similar resources
Tables, Memorized Semirings and Applications
The following is intended to be a contribution in the area of what could be called efficient algebraic structures or efficient data structures. In fact, we define and construct a new data structure, the tables, which are special kinds of two-row arrays. The first row is filled with words and the second with some coefficients. This structure generalizes the (finite) k-sets of Eilenberg [6], it is vers...
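The two-row structure described above can be illustrated with a minimal sketch: a table modeled as a mapping from words to coefficients, where repeated words are merged by a semiring-style addition (here taken to be ordinary `+`). All names below are hypothetical illustrations, not taken from the paper.

```python
# Illustrative sketch only: model a "table" (a two-row array of words and
# coefficients) as a dict, merging duplicate words by semiring addition.

def make_table(words, coefficients):
    """Combine a row of words and a row of coefficients into a table,
    summing the coefficients of repeated words."""
    table = {}
    for w, c in zip(words, coefficients):
        table[w] = table.get(w, 0) + c
    return table

def add_tables(t1, t2):
    """Pointwise semiring addition of two tables."""
    out = dict(t1)
    for w, c in t2.items():
        out[w] = out.get(w, 0) + c
    return out

t = make_table(["ab", "ba", "ab"], [1, 2, 3])
# t == {"ab": 4, "ba": 2}
```

Swapping `+` for another semiring operation (e.g. `min` or `max`) would give the same structure over a different coefficient semiring.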
Proximal Backpropagation
We propose proximal backpropagation (ProxProp) as a novel algorithm that takes implicit instead of explicit gradient steps to update the network parameters during neural network training. Our algorithm is motivated by the step size limitation of explicit gradient descent, which poses an impediment for optimization. ProxProp is developed from a general point of view on the backpropagation algori...
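The step-size limitation mentioned above can be illustrated on a one-dimensional quadratic. This is a generic sketch of the explicit-versus-implicit (proximal) step idea, not the ProxProp algorithm itself:

```python
# Sketch: explicit vs. implicit gradient step on f(x) = 0.5 * lam * x**2.
# Explicit: x_new = x - tau * f'(x)      -> diverges when tau * lam > 2.
# Implicit: x_new = x - tau * f'(x_new)  -> x_new = x / (1 + tau * lam),
# which is stable for every step size tau > 0.

def explicit_step(x, lam, tau):
    return x - tau * lam * x

def implicit_step(x, lam, tau):
    return x / (1.0 + tau * lam)

x_exp = x_imp = 1.0
lam, tau = 10.0, 1.0          # step size far beyond the explicit limit 2/lam
for _ in range(5):
    x_exp = explicit_step(x_exp, lam, tau)
    x_imp = implicit_step(x_imp, lam, tau)
# |x_exp| blows up as 9**k while x_imp contracts toward 0
```

The implicit update requires solving for `x_new` (here in closed form; in a network this becomes a proximal subproblem per layer), which is what buys the unconditional stability.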
Nonlinear backpropagation: doing backpropagation without derivatives of the activation function
The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the necessity for calculating the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithms in the framework of recurrent backpropagation and present some numerical simulations of fee...
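One common way to avoid the analytic derivative is to push the error through the activation function itself, replacing the product f'(h)·err with the difference f(h + err) − f(h). The following is a generic sketch of that idea, not necessarily the exact scheme of the paper:

```python
import math

def linear_delta(h, err):
    """Conventional backprop: analytic derivative of tanh times the error."""
    return (1.0 - math.tanh(h) ** 2) * err

def nonlinear_delta(h, err):
    """Derivative-free variant: a difference of activations replaces f'(h)*err."""
    return math.tanh(h + err) - math.tanh(h)

h, err = 0.3, 1e-3
# For small errors the two agree to first order:
#   nonlinear_delta(h, err) ~ linear_delta(h, err)
```

Since only forward evaluations of the activation are needed, this form is attractive for hardware realizations where the derivative circuit would be costly.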
Quick fuzzy backpropagation algorithm
A modification of the fuzzy backpropagation (FBP) algorithm called the QuickFBP algorithm is proposed, in which the computation of the net function is significantly quicker. It is proved that the FBP algorithm is of exponential time complexity, while the QuickFBP algorithm is of polynomial time complexity. Convergence conditions of the QuickFBP and, respectively, the FBP algorithm are defined and proved for: (1) ...
Conceptor-aided Backpropagation
Catastrophic interference has been a major roadblock in the research of continual learning. Here we propose a variant of the back-propagation algorithm, “conceptor-aided backprop” (CAB), in which gradients are shielded by conceptors against degradation of previously learned tasks. Conceptors have their origin in reservoir computing, where they have been previously shown to overcome catastrophic...
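The shielding idea can be sketched in a schematic way: a conceptor matrix C summarizes the directions occupied by earlier tasks, and gradients for a new task are projected through (I − C) so that updates avoid those directions. This is a simplified illustration (with the conceptor formula C = R(R + α⁻²I)⁻¹ from reservoir computing, where R is a state correlation matrix), not the full CAB algorithm:

```python
import numpy as np

# Schematic sketch: an earlier task occupies a single direction v in a
# 3-dimensional state space; the conceptor C captures it, and (I - C)
# suppresses gradient components along it while leaving the rest intact.

rng = np.random.default_rng(0)
v = np.array([[1.0, 2.0, 0.0]])                    # direction used by the old task
states = rng.standard_normal((100, 1)) @ v         # old-task states lie on span(v)
R = states.T @ states / states.shape[0]            # state correlation matrix
alpha = 10.0                                       # conceptor aperture
C = R @ np.linalg.inv(R + np.eye(3) / alpha**2)    # C = R (R + alpha^-2 I)^-1

g = np.array([1.0, 2.0, 3.0])                      # raw gradient for the new task
g_shielded = (np.eye(3) - C) @ g                   # shielded gradient
# The component of g_shielded along v is nearly zero, while the component
# orthogonal to v (here the third coordinate) passes through unchanged.
```

In the full method each layer maintains its own conceptor, updated as tasks are learned, but the projection step above is the core mechanism.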
Journal
Journal title: Neurocomputing
Year: 2020
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2020.08.055